Syntax Role for Neural Semantic Role Labeling
Authors
Abstract
Semantic role labeling (SRL) is dedicated to recognizing the semantic predicate-argument structure of a sentence. Previous studies in terms of traditional models have shown that syntactic information can make remarkable contributions to SRL performance; however, its necessity has been challenged by a few recent neural SRL studies that demonstrate impressive performance without syntactic backbones and suggest that syntax becomes much less important for semantic role labeling, especially when paired with deep networks and large-scale pre-trained language models. Despite this notion, the field still lacks a systematic and full investigation of the relevance of syntax to SRL, covering dependency SRL in both monolingual and multilingual settings. This paper intends to quantify the importance of syntax for SRL in the deep learning framework. We introduce three typical SRL frameworks (baselines), sequence-based, tree-based, and graph-based, which are accompanied by two categories of exploiting syntactic information: pruning-based and feature-based. Experiments are conducted on the CoNLL-2005, -2009, and -2012 benchmarks for all languages available, and the results show that SRL models can benefit from syntax under certain conditions. Furthermore, we quantify the significance of syntax for neural SRL models together with a thorough empirical survey using existing models.
Similar Resources
Semantic Role Labeling using Dependency Syntax
This document gives a brief introduction to the topic of Semantic Role Labeling using Dependency Syntax. We also describe a system that has been developed and tested on a corpus from the CoNLL-2008 shared task. We evaluate the system and give a short discussion on further improvements. Our results are reasonably good compared to those reached during the shared task.
Syntax Aware LSTM model for Semantic Role Labeling
In the Semantic Role Labeling (SRL) task, the tree-structured dependency relation is rich in syntactic information, but it is not well handled by existing models. In this paper, we propose Syntax Aware Long Short Time Memory (SA-LSTM). The structure of SA-LSTM changes according to the dependency structure of each sentence, so that SA-LSTM can model the whole tree structure of the dependency relation in an arc...
Syntax Aware LSTM Model for Chinese Semantic Role Labeling
As for the semantic role labeling (SRL) task, when it comes to utilizing parsing information, both traditional methods and recent recurrent neural network (RNN) based methods rely on feature engineering. In this paper, we propose Syntax Aware Long Short Time Memory (SA-LSTM). The structure of SA-LSTM is modified according to dependency parsing information in order to model parsing information direct...
A Simple and Accurate Syntax-Agnostic Neural Model for Dependency-based Semantic Role Labeling
We introduce a simple and accurate neural model for dependency-based semantic role labeling. Our model predicts predicate-argument dependencies relying on states of a bidirectional LSTM encoder. The semantic role labeler achieves competitive performance on English, even without any kind of syntactic information and only using local inference. However, when automatically predicted part-of-speech ...
Semantic Role Labeling with Neural Network Factors
We present a new method for semantic role labeling in which arguments and semantic roles are jointly embedded in a shared vector space for a given predicate. These embeddings belong to a neural network, whose output represents the potential functions of a graphical model designed for the SRL task. We consider both local and structured learning methods and obtain strong results on standard PropB...
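The core idea of the abstract above (arguments and semantic roles embedded in a shared vector space, with scores feeding a graphical model) can be sketched as a simple dot-product scorer. The dimensions, role inventory, and random initialization below are illustrative assumptions, not the paper's actual model.

```python
# Hedged sketch: score each semantic role for one argument by a dot
# product in a shared embedding space. The vectors here are randomly
# initialized stand-ins; in the real model they are learned jointly.
import numpy as np

rng = np.random.default_rng(0)
d = 8                                        # shared embedding size (assumed)
roles = ["ARG0", "ARG1", "ARGM-TMP"]         # toy role inventory
role_emb = rng.normal(size=(len(roles), d))  # one vector per role

def score_roles(arg_vec):
    """Return a potential (score) for each role given one encoded argument."""
    return role_emb @ arg_vec

arg_vec = rng.normal(size=d)                 # stand-in for an encoded argument
scores = score_roles(arg_vec)
best = roles[int(np.argmax(scores))]         # locally best role for this argument
```

In the paper these per-factor scores are not used greedily as above but serve as potentials in a structured (graphical-model) decoder.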
Journal
Journal title: Computational Linguistics
Year: 2021
ISSN: 1530-9312, 0891-2017
DOI: https://doi.org/10.1162/coli_a_00408